Modelling the Dynamic Relationship between Systematic Default and Recovery Risk
Default correlation modelling has become one of the most actively studied problems in credit derivatives pricing. An increase in default risk causes the recovery rate to change correspondingly, and the correlation between default and recovery rates has a noticeable effect on risk measures and on the pricing of credit derivatives.
After an introduction, we review the most recent literature on default correlation and on the relationship between default and recovery rates. We adopt the copula methodology, focusing on estimating default correlations rather than on modelling probabilities of default, and then use stress testing to compare the distributions of the probability of default under different copula functions. We develop a Gamma-Beta model that links the recovery rate directly to the individual probability of default, instead of an extended one-factor model that relates them through a systematic common factor.
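The copula-based stress test described here can be illustrated with a minimal one-factor Gaussian copula simulation. This is a sketch under assumed parameters: the function name and the choices pd = 5% and rho = 0.4 are illustrative, and the paper also compares other copula families.

```python
import math
import random
from statistics import NormalDist

def simulate_default_counts(pd, rho, n_obligors, n_sims, seed=42):
    """One-factor Gaussian copula: obligor i defaults in a scenario when
    sqrt(rho)*Z + sqrt(1-rho)*eps_i < Phi^{-1}(pd), where Z is the
    systematic factor shared by all obligors in that scenario."""
    thresh = NormalDist().inv_cdf(pd)  # default threshold Phi^{-1}(pd)
    rng = random.Random(seed)
    counts = []
    for _ in range(n_sims):
        z = rng.gauss(0.0, 1.0)  # systematic factor for this scenario
        defaults = sum(
            1
            for _ in range(n_obligors)
            if math.sqrt(rho) * z + math.sqrt(1.0 - rho) * rng.gauss(0.0, 1.0)
            < thresh
        )
        counts.append(defaults)
    return counts
```

Comparing the simulated distributions for rho = 0 and rho > 0 shows the effect being stress-tested: the marginal default probability stays at pd, but correlation fattens the upper tail of the portfolio default count.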
One-factor models are re-examined to explore correlated recovery rates under three distributions: the Logit-normal, the Normal and the Log-normal. By analyzing the results obtained from these two classes of modelling scheme, we argue that the direct-dependence (Gamma-Beta) model behaves better, both in estimating the recovery rate given an individual probability of default and in giving a clearer indication of their relationship. Finally, we apply default correlation and the correlated recovery rate to portfolio risk modelling. We conclude that if recovery rates are treated as independent stochastic variables, the expected losses in a large portfolio may be underestimated, because uncorrelated recovery risks can be diversified away; the correlation between default rate and recovery risk therefore cannot be neglected in applications. Here, for the first time to our knowledge, the recovery rate depends on the individual default probability by means of a closed formula.
Object Detection in Videos with Tubelet Proposal Networks
Object detection in videos has drawn increasing attention recently with the
introduction of the large-scale ImageNet VID dataset. Different from object
detection in static images, temporal information in videos is vital for object
detection. To fully utilize temporal information, state-of-the-art methods are
based on spatiotemporal tubelets, which are essentially sequences of associated
bounding boxes across time. However, the existing methods have major
limitations in generating tubelets in terms of quality and efficiency.
Motion-based methods are able to obtain dense tubelets efficiently, but the
lengths are generally only several frames, which is not optimal for
incorporating long-term temporal information. Appearance-based methods, usually
involving generic object tracking, could generate long tubelets, but are
usually computationally expensive. In this work, we propose a framework for
object detection in videos, which consists of a novel tubelet proposal network
to efficiently generate spatiotemporal proposals, and a Long Short-term Memory
(LSTM) network that incorporates temporal information from tubelet proposals
for achieving high object detection accuracy in videos. Experiments on the
large-scale ImageNet VID dataset demonstrate the effectiveness of the proposed
framework for object detection in videos.
Comment: CVPR 201
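The core data structure here, a tubelet as a sequence of associated bounding boxes across time, can be sketched with a simple greedy IoU linker. This is a hypothetical baseline for illustration, not the paper's tubelet proposal network.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def link_tubelets(frame_boxes, iou_thresh=0.5):
    """Greedily link per-frame detections into tubelets: each tubelet is
    extended with the unmatched box that best overlaps its last box."""
    tubelets = [[b] for b in frame_boxes[0]]
    for boxes in frame_boxes[1:]:
        unmatched = list(boxes)
        for tube in tubelets:
            if not unmatched:
                break
            best = max(unmatched, key=lambda b: iou(tube[-1], b))
            if iou(tube[-1], best) >= iou_thresh:
                tube.append(best)
                unmatched.remove(best)
        # detections that matched no existing tubelet start new ones
        tubelets.extend([b] for b in unmatched)
    return tubelets
```

Appearance- and motion-based methods differ mainly in how this association step is scored; the greedy IoU rule above stands in for that step only.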
Towards Robust Aspect-based Sentiment Analysis through Non-counterfactual Augmentations
While state-of-the-art NLP models have demonstrated excellent performance on aspect-based sentiment analysis (ABSA), substantial evidence has emerged of their lack of robustness, manifested especially as a significant degradation in performance on out-of-distribution data. Recent solutions that rely on counterfactually augmented datasets show promising results, but they are inherently limited by the lack of access to an explicit causal structure. In this paper, we present an alternative approach based on non-counterfactual data augmentation: we use noisy, cost-efficient augmentations that preserve the semantics associated with the target aspect, and then model invariances between the different versions of the data to improve robustness. A comprehensive suite of experiments shows that our proposal
significantly improves upon strong pre-trained baselines on both standard and
robustness-specific datasets. Our approach further establishes a new
state-of-the-art on the ABSA robustness benchmark and transfers well across
domains.
Comment: 10 pages, 1 figure, 10 tables
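One way to realise a noisy, aspect-preserving augmentation of the kind described is word dropout that never touches the aspect term. This is an illustrative recipe, not necessarily the paper's exact augmentation.

```python
import random

def aspect_preserving_dropout(tokens, aspect_tokens, p_drop=0.2, seed=0):
    """Noisy, cost-efficient augmentation: randomly drop context words,
    but never the tokens of the target aspect, so the semantics tied to
    that aspect are preserved across augmented views."""
    rng = random.Random(seed)
    protected = {t.lower() for t in aspect_tokens}
    return [t for t in tokens if t.lower() in protected or rng.random() >= p_drop]
```

Invariance modelling then amounts to encouraging consistent predictions across such views of the same sentence, for example with a consistency loss between the model's outputs on the original and augmented inputs.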
Observational constraints on cosmic neutrinos and dark energy revisited
Using several cosmological observations, i.e. the cosmic microwave background
anisotropies (WMAP), the weak gravitational lensing (CFHTLS), the measurements
of baryon acoustic oscillations (SDSS+WiggleZ), the most recent observational
Hubble parameter data, the Union2.1 compilation of type Ia supernovae, and the
HST prior, we impose constraints on the sum of neutrino masses ($\sum m_\nu$), the effective number of neutrino species ($N_{\rm eff}$) and the dark energy equation of state ($w$), individually and collectively. We find that a tight upper limit on $\sum m_\nu$ can be extracted from the full data combination if $N_{\rm eff}$ and $w$ are fixed. However, this upper bound is severely weakened if $N_{\rm eff}$ and $w$ are allowed to vary. This result naturally raises questions about the robustness of the strict upper bounds on $\sum m_\nu$ previously reported in the literature. The best-fit values from our most generalized constraint read $\sum m_\nu = 0.556^{+0.231}_{-0.288}\,\mathrm{eV}$ and $N_{\rm eff} = 3.839 \pm 0.452$, together with a best-fit $w$, at the 68% confidence level; these results show a firm lower limit on the total neutrino mass, favor an extra light degree of freedom, and support the cosmological constant model. The current weak lensing data are already helpful in constraining cosmological model parameters for fixed $w$. The Hubble parameter dataset gains numerous advantages over supernovae when $w$ is fixed, particularly its illuminating power in constraining $N_{\rm eff}$. As long as $w$ is included as a free parameter, it is still the standardizable candles of type Ia supernovae that play the most dominant role in the parameter constraints.
Comment: 39 pages, 15 figures, 7 tables, accepted to JCA
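The interplay described here, where adding datasets or priors tightens a bound while freeing extra parameters weakens it, can be pictured with a one-dimensional toy: combining independent Gaussian constraints by inverse-variance weighting. This is a sketch of the qualitative mechanism only, not the paper's full multi-parameter likelihood analysis.

```python
def combine_gaussians(constraints):
    """Combine independent Gaussian constraints (mean, sigma) on a single
    parameter by inverse-variance weighting. Each added constraint (for
    example an external prior on H0) can only shrink the combined sigma,
    while marginalising over extra free parameters inflates the effective
    sigma each dataset contributes."""
    weights = [1.0 / sigma**2 for _, sigma in constraints]
    mean = sum(m * w for (m, _), w in zip(constraints, weights)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return mean, sigma
```

For instance, two constraints with the same sigma combine to a bound tighter by a factor of sqrt(2), which is the toy analogue of why the full data combination yields the tightest limit on the neutrino mass sum.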